Bayesian inference via rejection filtering

Authors

  • Nathan Wiebe
  • Christopher E. Granade
  • Ashish Kapoor
  • Krysta Marie Svore
Abstract

We provide a method for approximating Bayesian inference using rejection sampling. We not only make the process efficient, but also dramatically reduce the memory required relative to conventional methods by combining rejection sampling with particle filtering. We also provide an approximate form of rejection sampling that makes rejection filtering tractable in cases where exact rejection sampling is not efficient. Finally, we present several numerical examples of rejection filtering that show its ability to track time-dependent parameters in online settings and also benchmark its performance on MNIST classification problems.
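The idea behind rejection filtering, roughly, is to draw candidate parameters from the current (Gaussian) model of the posterior, accept each candidate with probability proportional to its likelihood for the newest datum, and then summarize the accepted samples by their mean and covariance so that memory stays constant across updates. The sketch below is a minimal illustration of that idea, not the authors' implementation; the function names, the bound `kappa` (assumed here to dominate a likelihood bounded by 1), and the toy cos²-style likelihood are all assumptions made for the example.

```python
import numpy as np

def rejection_filter_update(prior_mean, prior_cov, datum, likelihood,
                            n_samples=1000, kappa=1.0, rng=None):
    """One Bayesian update via rejection sampling with a Gaussian model.

    Candidates are drawn from a Gaussian prior, accepted with probability
    likelihood(datum, x) / kappa, and the accepted samples are collapsed
    to a mean and covariance for the next update (constant memory).
    """
    rng = np.random.default_rng() if rng is None else rng
    x = rng.multivariate_normal(prior_mean, prior_cov, size=n_samples)
    # Accept x[i] with probability L(datum | x[i]) / kappa, where kappa >= max L.
    accept = rng.random(n_samples) * kappa < np.array([likelihood(datum, xi) for xi in x])
    accepted = x[accept]
    if len(accepted) < 2:          # too few acceptances: keep the prior model
        return prior_mean, prior_cov
    return accepted.mean(axis=0), np.atleast_2d(np.cov(accepted, rowvar=False))

# Illustrative use: infer an unknown frequency omega from binary outcomes whose
# probability of outcome 1 is cos^2(omega * t / 2), a common toy model.
def toy_likelihood(datum, x):
    outcome, t = datum
    p1 = np.cos(x[0] * t / 2) ** 2
    return p1 if outcome == 1 else 1 - p1

mean, cov = np.array([0.5]), np.array([[0.1]])
true_omega = 0.7
rng = np.random.default_rng(0)
for t in np.linspace(1, 20, 40):
    outcome = int(rng.random() < np.cos(true_omega * t / 2) ** 2)
    mean, cov = rejection_filter_update(mean, cov, (outcome, t), toy_likelihood, rng=rng)
print(mean, cov)
```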


Similar Articles

Bayesian Inference of (Co)Variance Components and Genetic Parameters for Economic Traits in Iranian Holsteins via Gibbs Sampling

The aim of this study was to use a Bayesian approach via Gibbs sampling (GS) to estimate genetic parameters of production, reproduction and health traits in Iranian Holstein cows. Data consisted of 320666 first-lactation records of Holstein cows from 7696 sires and 260302 dams, collected by the animal breeding center of Iran from 1991 to 2010. (Co)variance components were estimated using ...
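For readers unfamiliar with Gibbs sampling itself, the sketch below shows its basic mechanics on a toy problem: each parameter is drawn in turn from its full conditional distribution given the data and the other parameters. This is only a one-mean, one-variance normal model with conjugate priors, not the multi-trait animal model fitted in the study; the priors and hyperparameters here are illustrative assumptions.

```python
import numpy as np

def gibbs_normal(y, n_iter=5000, mu0=0.0, tau0_sq=1e6, a0=0.01, b0=0.01, seed=0):
    """Gibbs sampler for the mean and variance of a normal model with
    conjugate priors: mu ~ N(mu0, tau0_sq), sigma^2 ~ Inv-Gamma(a0, b0)."""
    rng = np.random.default_rng(seed)
    n, ybar = len(y), np.mean(y)
    mu, sigma_sq = ybar, np.var(y)
    draws = []
    for _ in range(n_iter):
        # mu | sigma^2, y  (normal full conditional)
        prec = 1.0 / tau0_sq + n / sigma_sq
        mean = (mu0 / tau0_sq + n * ybar / sigma_sq) / prec
        mu = rng.normal(mean, np.sqrt(1.0 / prec))
        # sigma^2 | mu, y  (inverse-gamma full conditional)
        a_n = a0 + n / 2.0
        b_n = b0 + 0.5 * np.sum((y - mu) ** 2)
        sigma_sq = 1.0 / rng.gamma(a_n, 1.0 / b_n)
        draws.append((mu, sigma_sq))
    return np.array(draws)

# Usage on simulated data; discard the first draws as burn-in.
samples = gibbs_normal(np.random.default_rng(1).normal(2.0, 1.5, size=200))
print(samples[1000:].mean(axis=0))   # posterior means of (mu, sigma^2)
```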


Filtering Outliers in Bayesian Optimization

Jarno Vanhatalo, Pasi Jylänki, and Aki Vehtari. Gaussian process regression with Student-t likelihood. In NIPS, pages 1910–1918, 2009. Amar Shah, Andrew Gordon Wilson, and Zoubin Ghahramani. Student-t processes as alternatives to Gaussian processes. In AISTATS, pages 877–885, 2014. Anthony O'Hagan. On outlier rejection phenomena in Bayes inference. Journal of the Royal Statistical Society. Seri...


Monte Carlo Techniques for Bayesian Statistical Inference – A comparative review

In this article, we summarise Monte Carlo simulation methods commonly used in Bayesian statistical computing. We give descriptions for each algorithm and provide R codes for their implementation via a simple 2-dimensional example. We compare the relative merits of these methods qualitatively by considering their general user-friendliness, and numerically in terms of mean squared error and computa...
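The review's worked example is in R; as a rough Python analogue, the sketch below runs one of the standard methods such a review covers, a random-walk Metropolis-Hastings sampler, on a simple 2-dimensional Gaussian target and reports the mean squared error of the posterior-mean estimate. The target, step size, and burn-in choice are assumptions made for illustration, not taken from the article.

```python
import numpy as np

def metropolis_2d(log_target, x0, n_iter=20000, step=0.5, seed=0):
    """Random-walk Metropolis-Hastings on a 2-D target density."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    chain = np.empty((n_iter, 2))
    logp = log_target(x)
    for i in range(n_iter):
        prop = x + step * rng.standard_normal(2)     # symmetric Gaussian proposal
        logp_prop = log_target(prop)
        if np.log(rng.random()) < logp_prop - logp:  # accept with prob min(1, ratio)
            x, logp = prop, logp_prop
        chain[i] = x
    return chain

# Toy 2-D target: correlated Gaussian with known mean, so the estimator's
# mean squared error can be measured directly.
true_mean = np.array([1.0, -1.0])
cov = np.array([[1.0, 0.8], [0.8, 1.0]])
prec = np.linalg.inv(cov)
log_target = lambda x: -0.5 * (x - true_mean) @ prec @ (x - true_mean)

chain = metropolis_2d(log_target, x0=[0.0, 0.0])
est = chain[5000:].mean(axis=0)                      # discard burn-in
print("estimate:", est, "MSE:", np.mean((est - true_mean) ** 2))
```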


Measuring the Hardness of Stochastic Sampling on Bayesian Networks with Deterministic Causalities: the k-Test

Approximate Bayesian inference is NP-hard. Dagum and Luby defined the Local Variance Bound (LVB) to measure the approximation hardness of Bayesian inference on Bayesian networks, assuming the networks model strictly positive joint probability distributions, i.e. zero probabilities are not permitted. This paper introduces the k-test to measure the approximation hardness of inference on Bayesian ...


Adaptive MCMC Methods for Inference on Discretely Observed Affine Jump Diffusion Models

In the present paper we generalize in a Bayesian framework the inferential solution proposed by Eraker, Johannes & Polson (2003) for stochastic volatility models with jumps and affine structure. We will use an adaptive sampling methodology known as Delayed Rejection, suggested in Tierney & Mira (1999), in a Markov chain Monte Carlo setting in order to reduce the asymptotic variance of the estima...
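Delayed rejection modifies Metropolis-Hastings so that a rejected proposal triggers a second, usually more conservative, proposal whose acceptance probability is corrected to preserve the target distribution, which can lower the asymptotic variance of the resulting estimates. The sketch below is a minimal one-stage delayed-rejection random-walk sampler on a generic log-density, not the adaptive sampler for affine jump-diffusion models described above; the proposal scales and the toy target are assumptions.

```python
import numpy as np

def dr_metropolis(log_pi, x0, n_iter=20000, s1=2.0, s2=0.5, seed=0):
    """Metropolis-Hastings with one stage of delayed rejection.

    Stage 1 proposes a bold symmetric Gaussian move (scale s1); if it is
    rejected, stage 2 proposes a smaller move (scale s2) and accepts it with
    the delayed-rejection probability (here specialized to symmetric proposals).
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    d = x.size
    chain = np.empty((n_iter, d))
    lp_x = log_pi(x)
    for i in range(n_iter):
        # Stage 1: symmetric random-walk proposal.
        y1 = x + s1 * rng.standard_normal(d)
        lp_y1 = log_pi(y1)
        a1_x = np.exp(min(0.0, lp_y1 - lp_x))          # alpha_1(x, y1)
        if rng.random() < a1_x:
            x, lp_x = y1, lp_y1
        else:
            # Stage 2: smaller symmetric proposal from the current point.
            y2 = x + s2 * rng.standard_normal(d)
            lp_y2 = log_pi(y2)
            a1_rev = np.exp(min(0.0, lp_y1 - lp_y2))   # alpha_1(y2, y1)
            # The symmetric stage-2 kernel cancels; the stage-1 ratio does not.
            log_q1_ratio = (np.sum((y1 - x) ** 2) - np.sum((y1 - y2) ** 2)) / (2 * s1 ** 2)
            num = np.exp(lp_y2 + log_q1_ratio) * (1.0 - a1_rev)
            den = np.exp(lp_x) * (1.0 - a1_x)
            if den > 0 and rng.random() < min(1.0, num / den):
                x, lp_x = y2, lp_y2
        chain[i] = x
    return chain

# Usage on a 1-D standard normal target (illustrative only).
chain = dr_metropolis(lambda x: -0.5 * float(x @ x), np.zeros(1))
print(chain.mean(), chain.var())
```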



Journal:
  • CoRR

Volume: abs/1511.06458  Issue:

Pages: -

Publication date: 2015